
@pexip/media-processor

A library for media analysis using Web APIs.

Type Aliases

AnalyzerNodeInit

Ƭ AnalyzerNodeInit: AudioNodeInit<AnalyserNode, Analyzer>


AsyncCallback

Ƭ AsyncCallback: () => Promise<void>

Type declaration

▸ (): Promise<void>

Returns

Promise<void>


AudioBufferBytes

Ƭ AudioBufferBytes: Uint8Array

Same as the return from AnalyserNode.getByteFrequencyData()


AudioBufferFloats

Ƭ AudioBufferFloats: Float32Array

Same as AudioBuffer, or the return from AnalyserNode.getFloatFrequencyData()


AudioDestinationNodeInit

Ƭ AudioDestinationNodeInit: AudioNodeInit<AudioDestinationNode, AudioDestinationNode>


AudioNodeConnectParam

Ƭ AudioNodeConnectParam: ConnectParamBase<ConnectParamType>


AudioNodeInitConnectParam

Ƭ AudioNodeInitConnectParam: ConnectParamBase<ConnectInitParamType>


AudioNodeInitConnection

Ƭ AudioNodeInitConnection: AudioNodeInitConnectParam[]


AudioNodeInitConnections

Ƭ AudioNodeInitConnections: AudioNodeInitConnection[]


AudioNodeParam

Ƭ AudioNodeParam: BaseAudioNode | AudioParam


AudioSamples

Ƭ AudioSamples: number[] | AudioBufferFloats | AudioBufferBytes

Audio samples from each channel, either in float or bytes form

Example

|  |  | Sample Frame 1 | Sample Frame 2 | Sample Frame 3 |
| --- | --- | --- | --- | --- |
| Input | Channel L: | sample 1 | sample 2 | sample 3 |
|  | Channel R: | sample 1 | sample 2 | sample 3 |

We can get two AudioSamples from "Input": one from "Channel L" and one from "Channel R".
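
For instance, a hedged sketch of reading the two channels from a decoded AudioBuffer (the audioBuffer variable is an assumption):

const left: AudioSamples = audioBuffer.getChannelData(0); // Channel L
const right: AudioSamples = audioBuffer.getChannelData(1); // Channel R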


BaseAudioNode

Ƭ BaseAudioNode: Pick<AudioNode, NodeConnectionAction>


Callback

Ƭ Callback<R, T>: (...params: T) => R

Type parameters

| Name | Type |
| --- | --- |
| R | R |
| T | extends unknown[] |

Type declaration

▸ (...params): R

Parameters
| Name | Type |
| --- | --- |
| ...params | T |
Returns

R


Canvas

Ƭ Canvas: HTMLCanvasElement | OffscreenCanvas


CanvasContext

Ƭ CanvasContext: CanvasRenderingContext2D | OffscreenCanvasRenderingContext2D


ChannelSplitterNodeInit

Ƭ ChannelSplitterNodeInit: AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>


Color

Ƭ Color: Object

Type declaration

| Name | Type |
| --- | --- |
| a | number |
| b | number |
| g | number |
| r | number |

ConnectInitParamBaseType

Ƭ ConnectInitParamBaseType: AudioParam | AudioNodeInit


ConnectInitParamType

Ƭ ConnectInitParamType: ConnectInitParamBaseType | undefined


ConnectParamBase

Ƭ ConnectParamBase<T>: [T, T extends undefined ? undefined : AudioNodeOutputIndex, T extends undefined ? undefined : AudioNodeInputIndex] | T

Type parameters

| Name | Type |
| --- | --- |
| T | extends ConnectParamType \| ConnectInitParamType |

ConnectParamBaseType

Ƭ ConnectParamBaseType: AudioParam | BaseAudioNode


ConnectParamType

Ƭ ConnectParamType: ConnectParamBaseType | undefined


DelayNodeInit

Ƭ DelayNodeInit: AudioNodeInit<DelayNode, DelayNode>


DenoiseWorkletNodeInit

Ƭ DenoiseWorkletNodeInit: AudioNodeInit<AudioWorkletNode>


GainNodeInit

Ƭ GainNodeInit: AudioNodeInit<GainNode, Gain>


ImageType

Ƭ ImageType: CanvasImageSource | ProcessInputType | VideoFrame


InputFrame

Ƭ InputFrame: CanvasImageSource | VideoFrame


IsVoice

Ƭ IsVoice<T>: (data: T) => boolean

Type parameters

| Name |
| --- |
| T |

Type declaration

▸ (data): boolean

Parameters
| Name | Type |
| --- | --- |
| data | T |
Returns

boolean


MaskUnderlyingType

Ƭ MaskUnderlyingType: "canvasimagesource" | "imagedata" | "tensor"

Interfaces from the "tensorflow-models/body-segmentation" package


MediaElementAudioSourceNodeInit

Ƭ MediaElementAudioSourceNodeInit: AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>


MediaStreamAudioDestinationNodeInit

Ƭ MediaStreamAudioDestinationNodeInit: AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>


MediaStreamAudioSourceNodeInit

Ƭ MediaStreamAudioSourceNodeInit: AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>


Node

Ƭ Node: AudioNode | AudioParam


NodeConnectionAction

Ƭ NodeConnectionAction: "connect" | "disconnect"


Nodes

Ƭ Nodes: Node[]


ProcessInputType

Ƭ ProcessInputType: ImageData | HTMLVideoElement | HTMLImageElement | OffscreenCanvas | HTMLCanvasElement | ImageBitmap


ProcessStatus

Ƭ ProcessStatus: "created" | "opened" | "opening" | "processing" | "idle" | "closed" | "destroying" | "destroyed"


ProcessVideoTrack

Ƭ ProcessVideoTrack: (track: MediaStreamVideoTrack, transformers: Transformer<InputFrame, InputFrame>[], options?: Options) => Promise<Track>

Type declaration

▸ (track, transformers, options?): Promise<Track>

Parameters
| Name | Type |
| --- | --- |
| track | MediaStreamVideoTrack |
| transformers | Transformer<InputFrame, InputFrame>[] |
| options? | Options |
Returns

Promise<Track>


Rect

Ƭ Rect: Point & Size


RenderEffects

Ƭ RenderEffects: typeof RENDER_EFFECTS[number]


RunnerCreator

Ƭ RunnerCreator<P, R>: (callback: Callback<R, P>, frameRate: number) => Runner<P>

Type parameters

| Name | Type |
| --- | --- |
| P | extends unknown[] |
| R | R |

Type declaration

▸ (callback, frameRate): Runner<P>

Parameters
| Name | Type |
| --- | --- |
| callback | Callback<R, P> |
| frameRate | number |
Returns

Runner<P>


SegmentationModel

Ƭ SegmentationModel: typeof SEG_MODELS[number]


SegmentationTransform

Ƭ SegmentationTransform: Transform<InputFrame, InputFrame> & SegmentationParams & Omit<Process, "open">


UniversalAudioContextState

Ƭ UniversalAudioContextState: AudioContextState | "interrupted"

We need to add the missing type definition to work with AudioContextState in Safari. See https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state#resuming_interrupted_play_states_in_ios_safari


Unsubscribe

Ƭ Unsubscribe: () => void

Type declaration

▸ (): void

Unsubscribe the subscription

Returns

void


WasmPaths

Ƭ WasmPaths: [string, string | undefined]

Variables

BACKGROUND_BLUR_AMOUNT

Const BACKGROUND_BLUR_AMOUNT: 3


CLIP_COUNT_THRESHOLD

Const CLIP_COUNT_THRESHOLD: 6

Default clipping count threshold: the number of consecutive clipThreshold-level samples that indicates clipping.


CLIP_THRESHOLD

Const CLIP_THRESHOLD: 0.98

Default clipping detection threshold


EDGE_BLUR_AMOUNT

Const EDGE_BLUR_AMOUNT: 3


FLIP_HORIZONTAL

Const FLIP_HORIZONTAL: false


FOREGROUND_THRESHOLD

Const FOREGROUND_THRESHOLD: 0.5


FRAME_RATE

Const FRAME_RATE: 20


LOW_VOLUME_THRESHOLD

Const LOW_VOLUME_THRESHOLD: -60

Default low volume detection threshold


MONO_THRESHOLD

Const MONO_THRESHOLD: number

Default mono detection threshold. Data must be identical to within one 16-bit LSB to be identified as mono.


PROCESSING_HEIGHT

Const PROCESSING_HEIGHT: 432


PROCESSING_WIDTH

Const PROCESSING_WIDTH: 768


RENDER_EFFECTS

Const RENDER_EFFECTS: readonly ["none", "blur", "overlay"]


SEG_MODELS

Const SEG_MODELS: readonly ["mediapipeSelfie"]


SILENT_THRESHOLD

Const SILENT_THRESHOLD: number

Default silent threshold: at least one 16-bit LSB (the comparison is on the absolute value).


VOICE_PROBABILITY_THRESHOLD

Const VOICE_PROBABILITY_THRESHOLD: 0.3

Default voice probability threshold


urls

Const urls: Object

Type declaration

| Name | Type |
| --- | --- |
| denoise | () => URL |

Functions

avg

avg(nums): number

Average an array of numbers

Parameters

| Name | Type | Description |
| --- | --- | --- |
| nums | number[] | An array of numbers |

Returns

number


calculateDistance

calculateDistance(p1, p2): number

Calculate the distance between two Points

Parameters

| Name | Type | Description |
| --- | --- | --- |
| p1 | Point | Point 1 |
| p2 | Point | Point 2 |

Returns

number


calculateFps

calculateFps(time): number

Parameters

| Name | Type |
| --- | --- |
| time | number |

Returns

number


closedCurve

closedCurve(«destructured»): (data: Point[]) => string

Create a cubic Bezier curve path that closes back to the starting point via the provided reference point

Example

closedCurve({x:0, y:20})([{x:0, y:0}, {x:3, y:4}, {x:9, y:16}]);
// Output:
// M 0,0 C 0,0 1.778263374435667,1.8280237767745193 3,4 C 6.278263374435667,9.828023776774518 9,16 9,16 V 20 H 0 Z

Parameters

| Name | Type |
| --- | --- |
| «destructured» | Point |

Returns

fn

▸ (data): string

Parameters
| Name | Type |
| --- | --- |
| data | Point[] |
Returns

string


copyByteBufferToFloatBuffer

copyByteBufferToFloatBuffer(bytes, floats): void

Copy data from a Uint8Array buffer to a Float32Array buffer, with byte-to-float conversion

Parameters

| Name | Type | Description |
| --- | --- | --- |
| bytes | Uint8Array | The source byte buffer |
| floats | Float32Array | The destination buffer |

Returns

void
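
A hedged usage sketch, assuming the linear byte-to-float mapping described under fromByteToFloat below:

const bytes = new Uint8Array([0, 128, 255]);
const floats = new Float32Array(bytes.length);
copyByteBufferToFloatBuffer(bytes, floats);
// floats is now approximately [-1, 0, 0.99]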


createAnalyzerGraphNode

createAnalyzerGraphNode(options?): AudioNodeInit<AnalyserNode, Analyzer>

Create an AnalyzerNodeInit

See

AnalyserOptions

Parameters

| Name | Type |
| --- | --- |
| options? | AnalyserOptions |

Returns

AudioNodeInit<AnalyserNode, Analyzer>


createAnalyzerSubscribableGraphNode

createAnalyzerSubscribableGraphNode(«destructured»): AudioNodeInit<AnalyserNode, Analyzer>

Create an analyzer node with push-based subscription

Parameters

| Name | Type |
| --- | --- |
| «destructured» | AnalyzerSubscribableOptions & AnalyserOptions |

Returns

AudioNodeInit<AnalyserNode, Analyzer>


createAsyncCallbackLoop

createAsyncCallbackLoop<P, R>(callback, frameRate, «destructured»?): Object

Create an async callback loop that invokes the callback recursively, with a delay derived from the frameRate

Type parameters

| Name | Type |
| --- | --- |
| P | extends unknown[] |
| R | extends Promise<unknown> |

Parameters

| Name | Type | Description |
| --- | --- | --- |
| callback | Callback<R, P> | The callback to be invoked |
| frameRate | number | The expected rate at which to invoke the callback |
| «destructured» | Partial<AsyncCallbackLoopOptions> | - |

Returns

Object

| Name | Type |
| --- | --- |
| start | (...params: P) => Promise<void> |
| stop | () => void |
| get frameRate() | number |
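
A minimal usage sketch based on the returned shape above (the work inside the callback is a placeholder):

const loop = createAsyncCallbackLoop(async () => {
    // periodic async work goes here
}, 30);
void loop.start(); // kick the loop off at roughly 30 invocations per second
loop.stop(); // stop it later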

createAudioContext

createAudioContext(options?): AudioContext

A function to create an AudioContext using the constructor or the factory function, depending on what the browser supports

See

AudioContextOptions

Parameters

| Name | Type |
| --- | --- |
| options? | AudioContextOptions |

Returns

AudioContext


createAudioDestinationGraphNode

createAudioDestinationGraphNode(): AudioNodeInit<AudioDestinationNode, AudioDestinationNode>

Create an AudioDestinationNode

Returns

AudioNodeInit<AudioDestinationNode, AudioDestinationNode>


createAudioGraph

createAudioGraph(initialConnections, options?): AudioGraph

Accepts AudioNodeInitConnections to build the audio graph within a single audio context

See

AudioGraphOptions

Example

const source = createStreamSourceGraphNode(stream);
const analyzer = createAnalyzerGraphNode({fftSize});
const audioGraph = createAudioGraph([[source, analyzer]]);

Parameters

| Name | Type | Description |
| --- | --- | --- |
| initialConnections | AudioNodeInitConnections | A list of AudioNodeInit to build the graph in a linear fashion |
| options | AudioGraphOptions | - |

Returns

AudioGraph


createAudioGraphProxy

createAudioGraphProxy(audioGraph, handlers): AudioGraph

Parameters

| Name | Type |
| --- | --- |
| audioGraph | AudioGraph |
| handlers | AudioGraphProxyHandlers |

Returns

AudioGraph


createAudioSignalDetector

createAudioSignalDetector(shouldDetect, onDetected): (buffer: Queue<number[]>, threshold?: number) => (samples: number[]) => void

Create a function to process the AudioStats and check whether the signal is silent. The onSignalDetected callback is called whenever the silent state changes, i.e. in two of the four cases below:

Logic

| lastCheck | silent | should call onSignalDetected |
| --- | --- | --- |
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |

Parameters

| Name | Type |
| --- | --- |
| shouldDetect | () => boolean |
| onDetected | (silent: boolean) => void |

Returns

fn

▸ (buffer, threshold?): (samples: number[]) => void

Parameters
| Name | Type |
| --- | --- |
| buffer | Queue<number[]> |
| threshold? | number |
Returns

fn

▸ (samples): void

Parameters
| Name | Type |
| --- | --- |
| samples | number[] |
Returns

void


createAudioStats

createAudioStats(stats?, options?): AudioStats

AudioStats builder

Parameters

| Name | Type | Description |
| --- | --- | --- |
| stats | Partial<AudioStats> | Overwrite the default attributes |
| options | Object | silentThreshold, lowVolumeThreshold and clipCountThreshold |
| options.clipCountThreshold? | number | - |
| options.lowVolumeThreshold? | number | - |
| options.silentThreshold? | number | - |

Returns

AudioStats


createBenchmark

createBenchmark(clock?, «destructured»?): Benchmark

Parameters

| Name | Type | Default value |
| --- | --- | --- |
| clock | Clock | performance |
| «destructured» | BenchmarkOptions | {} |

Returns

Benchmark


createCanvasTransform

createCanvasTransform(segmenter, «destructured»?): SegmentationTransform

Parameters

| Name | Type |
| --- | --- |
| segmenter | Segmenter |
| «destructured» | Partial<Options> |

Returns

SegmentationTransform


createChannelMergerGraphNode

createChannelMergerGraphNode(options?): AudioNodeInit<ChannelMergerNode, ChannelMergerNode>

Create a ChannelMergerNode

See

ChannelMergerOptions

Parameters

| Name | Type |
| --- | --- |
| options? | ChannelMergerOptions |

Returns

AudioNodeInit<ChannelMergerNode, ChannelMergerNode>


createChannelSplitterGraphNode

createChannelSplitterGraphNode(options?): AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>

Create a ChannelSplitterNode

See

ChannelSplitterOptions

Parameters

| Name | Type |
| --- | --- |
| options? | ChannelSplitterOptions |

Returns

AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>


createDelayGraphNode

createDelayGraphNode(options?): AudioNodeInit<DelayNode, DelayNode>

Create a DelayNode

See

DelayOptions

Parameters

| Name | Type |
| --- | --- |
| options? | DelayOptions |

Returns

AudioNodeInit<DelayNode, DelayNode>


createDenoiseWorkletGraphNode

createDenoiseWorkletGraphNode(data, messageHandler?): AudioNodeInit<AudioWorkletNode, AudioWorkletNode>

Create a noise suppression node

Parameters

| Name | Type | Description |
| --- | --- | --- |
| data | BufferSource | WebAssembly source |
| messageHandler? | (vads: number[]) => void | - |

Returns

AudioNodeInit<AudioWorkletNode, AudioWorkletNode>
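
A hedged construction sketch; the wasm path here is a placeholder, not a documented asset:

const wasm = await (await fetch('/denoise.wasm')).arrayBuffer();
const denoise = createDenoiseWorkletGraphNode(wasm, vads => console.log('VAD', vads));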


createFrameCallbackRequest

createFrameCallbackRequest(callback, frameRate, «destructured»?): Object

Create a callback loop for video frame processing, using requestVideoFrameCallback under the hood when available; otherwise a fallback implementation based on setTimeout is used.

Parameters

| Name | Type | Description |
| --- | --- | --- |
| callback | Callback<Promise<void>, [ProcessInputType]> | To be called by the loop |
| frameRate | number | A fallback frame rate for when we are not able to get the rate from the API |
| «destructured» | FrameCallbackRequestOptions | - |

Returns

Object

| Name | Type |
| --- | --- |
| start | (input: ProcessInputType) => Promise<void> |
| stop | () => void |
| get frameRate() | number |

createGainGraphNode

createGainGraphNode(mute): AudioNodeInit<GainNode, Gain>

Create a GainNodeInit

Parameters

| Name | Type | Description |
| --- | --- | --- |
| mute | boolean | Initial mute state |

Returns

AudioNodeInit<GainNode, Gain>


createMediaElementSourceGraphNode

createMediaElementSourceGraphNode(mediaElement): AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>

Create a MediaElementAudioSourceNodeInit

Parameters

| Name | Type |
| --- | --- |
| mediaElement | HTMLMediaElement |

Returns

AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>


createMediapipeSegmenter

createMediapipeSegmenter(basePath?, «destructured»?): Segmenter

Parameters

| Name | Type | Default value |
| --- | --- | --- |
| basePath | string | '/' |
| «destructured» | Partial<Options> | {} |

Returns

Segmenter


createStreamDestinationGraphNode

createStreamDestinationGraphNode(options?): AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>

Create a MediaStreamAudioDestinationNodeInit

Parameters

| Name | Type |
| --- | --- |
| options? | AudioNodeOptions |

Returns

AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>


createStreamSourceGraphNode

createStreamSourceGraphNode(mediaStream, shouldResetEnabled?): AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>

Create a MediaStreamAudioSourceNodeInit

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| mediaStream | MediaStream | undefined | Source MediaStream |
| shouldResetEnabled | boolean | true | Whether or not to enable the cloned track |

Returns

AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>


createVADetector

createVADetector(onDetected, shouldDetect, options?): <T>(isVoice: IsVoice<T>) => (data: T) => void

Create a voice detector based on provided params

See

ThrottleOptions

Parameters

| Name | Type | Description |
| --- | --- | --- |
| onDetected | () => void | Called when there is voice activity |
| shouldDetect | () => boolean | When it returns true, voice activity detection runs; otherwise it does not |
| options? | ThrottleOptions | - |

Returns

fn

▸ <T>(isVoice): (data: T) => void

Type parameters
| Name |
| --- |
| T |
Parameters
| Name | Type |
| --- | --- |
| isVoice | IsVoice<T> |
Returns

fn

▸ (data): void

Parameters
| Name | Type |
| --- | --- |
| data | T |
Returns

void
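
A hedged usage sketch combining this with createVoiceDetectorFromProbability (documented below); the logging is a placeholder:

const detect = createVADetector(
    () => console.log('voice activity'), // onDetected
    () => true, // shouldDetect: always run detection
);
const onProbability = detect(createVoiceDetectorFromProbability());
onProbability(0.8); // above VOICE_PROBABILITY_THRESHOLD, so onDetected fires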


createVideoProcessor

createVideoProcessor(transformers, processTrack): VideoProcessor

Parameters

| Name | Type |
| --- | --- |
| transformers | Transform<InputFrame, InputFrame>[] |
| processTrack | ProcessVideoTrack |

Returns

VideoProcessor


createVideoTrackProcessor

createVideoTrackProcessor(): ProcessVideoTrack

Returns

ProcessVideoTrack


createVideoTrackProcessorWithFallback

createVideoTrackProcessorWithFallback(«destructured»?): ProcessVideoTrack

Parameters

| Name | Type |
| --- | --- |
| «destructured» | FallbackOptions |

Returns

ProcessVideoTrack
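
A hedged sketch of how these video pieces compose, based only on the signatures documented here (the '/models' base path is a placeholder):

const segmenter = createMediapipeSegmenter('/models');
const transform = createCanvasTransform(segmenter);
const processor = createVideoProcessor([transform], createVideoTrackProcessorWithFallback());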


createVoiceDetectorFromProbability

createVoiceDetectorFromProbability(voiceThreshold?): IsVoice<number>

A function to check whether the provided probability is considered voice activity

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| voiceThreshold | number | VOICE_PROBABILITY_THRESHOLD | The probability threshold at or above which the input is considered voice activity |

Returns

IsVoice<number>


createVoiceDetectorFromTimeData

createVoiceDetectorFromTimeData(options?): IsVoice<number[]>

A function to check whether the provided time-series data is considered voice activity

See

VAOptions

Parameters

| Name | Type |
| --- | --- |
| options | VAOptions |

Returns

IsVoice<number[]>


curve

curve(data): string

Create a cubic Bezier curve path command

Example

curve([{x:0, y:0}, {x:3, y:4}, {x:9, y:16}]);
// Output:
// M 0,0 C 0,0 1.778263374435667,1.8280237767745193 3,4 C 6.278263374435667,9.828023776774518 9,16 9,16

Parameters

| Name | Type | Description |
| --- | --- | --- |
| data | Point[] | An array of Points |

Returns

string


fitDestinationSize

fitDestinationSize(sw, sh, dw, dh): Rect

Convert the source size to the destination size when necessary, based on the height

Parameters

| Name | Type | Description |
| --- | --- | --- |
| sw | number | Source width |
| sh | number | Source height |
| dw | number | Destination width |
| dh | number | Destination height |

Returns

Rect


fromByteToFloat

fromByteToFloat(value): number

Convert a byte to float, according to web audio spec

Floating point audio samples are defined as non-interleaved IEEE 754 32-bit linear PCM with a nominal range between -1 and +1, i.e. a 32-bit floating point buffer with each sample between -1.0 and 1.0. https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer

Byte samples are represented as follows: 128 is silence, 0 is the negative max, 255 is the positive max

Remarks

Ref. https://www.w3.org/TR/webaudio/#dom-analysernode-getbytetimedomaindata

Parameters

| Name | Type | Description |
| --- | --- | --- |
| value | number | The byte value to convert to float |

Returns

number
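
A minimal sketch of the mapping described above, assuming the linear conversion from the Web Audio spec (not necessarily the library's exact code):

const byteToFloat = (value: number): number => (value - 128) / 128;
byteToFloat(128); // 0 (silence)
byteToFloat(0); // -1 (negative max)
byteToFloat(255); // ≈ 0.992 (positive max)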


fromFloatToByte

fromFloatToByte(value): number

Convert a float to byte, according to web audio spec

Floating point audio samples are defined as non-interleaved IEEE 754 32-bit linear PCM with a nominal range between -1 and +1, i.e. a 32-bit floating point buffer with each sample between -1.0 and 1.0. https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer

Byte samples are represented as follows: 128 is silence, 0 is the negative max, 255 is the positive max

Remarks

Ref. https://www.w3.org/TR/webaudio/#dom-analysernode-getbytetimedomaindata

Parameters

| Name | Type | Description |
| --- | --- | --- |
| value | number | The float value to convert to byte |

Returns

number


getAudioStats

getAudioStats(options): AudioStats

Calculate the audio stats; the samples are expected to be in float form

Remarks

http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing

Parameters

| Name | Type | Description |
| --- | --- | --- |
| options | StatsOptions | See StatsOptions |

Returns

AudioStats


getBezierCurveControlPoints

getBezierCurveControlPoints(«destructured»): [Point, Point]

Spline Interpolation for Bezier Curve

Remarks

Ref. http://scaledinnovation.com/analytics/splines/aboutSplines.html
Alt. https://www.particleincell.com/2012/bezier-splines/

Parameters

| Name | Type |
| --- | --- |
| «destructured» | Object |
| › p1 | Point |
| › p2 | Point |
| › p3 | Point |
| › t | number |

Returns

[Point, Point]


isAnalyzerNodeInit

isAnalyzerNodeInit(t): t is AnalyzerNodeInit

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is AnalyzerNodeInit


isAudioNode

isAudioNode(t): t is AudioNode

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is AudioNode


isAudioNodeInit

isAudioNodeInit(t): t is AudioNodeInit<AudioNode, BaseAudioNode>

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is AudioNodeInit<AudioNode, BaseAudioNode>


isAudioParam

isAudioParam(t): t is AudioParam

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is AudioParam


isClipping

isClipping(clipCount, threshold?): boolean

Check if there is clipping

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| clipCount | number | undefined | Number of consecutive clips |
| threshold | number | CLIP_COUNT_THRESHOLD | - |

Returns

boolean

true if the clipCount is above the threshold, aka clipping


isEqualSize

isEqualSize(widthA, heightA, widthB, heightB): boolean

Compare the provided width and height to see if they are the same

Parameters

| Name | Type | Description |
| --- | --- | --- |
| widthA | number | The width of A |
| heightA | number | The height of A |
| widthB | number | The width of B |
| heightB | number | The height of B |

Returns

boolean


isLowVolume

isLowVolume(gain, threshold?): boolean

Check if the provided gain is below the low volume threshold, which is considered low volume.

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| gain | number | undefined | Floating point representation of the gain |
| threshold | number | LOW_VOLUME_THRESHOLD | - |

Returns

boolean

true if the gain is lower than the threshold


isMono

isMono(channels, threshold?): boolean

Check if the provided channels are mono or stereo

Default Value

1.0 / 32767

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| channels | AudioSamples[] | undefined | Audio channels, assuming the inputs are in floating point form |
| threshold | number | MONO_THRESHOLD | Mono detection threshold, defaults to floating point form |

Returns

boolean

true if they are mono, otherwise stereo


isRenderEffects

isRenderEffects(t): t is "blur" | "none" | "overlay"

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is "blur" | "none" | "overlay"


isSegmentationModel

isSegmentationModel(t): t is "mediapipeSelfie"

Parameters

| Name | Type |
| --- | --- |
| t | unknown |

Returns

t is "mediapipeSelfie"


isSilent

isSilent(samples, threshold?): boolean

Simple silent detection that only checks the first and last values of the sample data

Default Value

1.0 / 32767, assuming the samples are float values

Parameters

| Name | Type | Default value | Description |
| --- | --- | --- | --- |
| samples | AudioSamples | undefined | Audio sample data; this can be in floating point or byte form, as long as the threshold value is given accordingly |
| threshold | number | SILENT_THRESHOLD | Silent threshold |

Returns

boolean

true when it is silent


isVoiceActivity

isVoiceActivity(options?): (volume: number) => boolean

A naive voice activity detection

Parameters

| Name | Type | Description |
| --- | --- | --- |
| options | VAOptions | See VAOptions |

Returns

fn

(volume: number) => boolean, true if there is voice

▸ (volume): boolean

Parameters
| Name | Type |
| --- | --- |
| volume | number |
Returns

boolean


line

line(data): string

Create a straight line path command

Example

line([{x:0, y:0}, {x:2, y:2}]);
// Output:
// M 0,0 L 2,2

Parameters

| Name | Type | Description |
| --- | --- | --- |
| data | Point[] | An array of Points |

Returns

string


loadScript

loadScript(path, id): Promise<void>

Parameters

| Name | Type |
| --- | --- |
| path | string |
| id | string |

Returns

Promise<void>


loadTfjsBackendWebGl

loadTfjsBackendWebGl(): Promise<{ GPGPUContext ; MathBackendWebGL ; gpgpu_util ; setWebGLContext ; version_webgl ; webgl_util ; webgl ; forceHalfFloat ; default }>

Load the TensorFlow.js WebGL backend as a module object. The resolved module exposes GPGPUContext, MathBackendWebGL, gpgpu_util, webgl_util, setWebGLContext, webgl.forceHalfFloat and version_webgl ("4.13.0").

Returns

Promise<{ GPGPUContext ; MathBackendWebGL ; gpgpu_util ; setWebGLContext ; version_webgl ; webgl_util ; webgl ; forceHalfFloat ; default }>


loadTfjsCore

loadTfjsCore(prodMode): Promise<unknown>

Parameters

| Name | Type |
| --- | --- |
| prodMode | boolean |

Returns

Promise<unknown>


loadWasms

loadWasms(paths): Promise<void>

Parameters

| Name | Type |
| --- | --- |
| paths | WasmPaths[] |

Returns

Promise<void>


pow

pow(exponent): (base: number) => number

Curried pow function from Math: number -> number -> number

Parameters

| Name | Type | Description |
| --- | --- | --- |
| exponent | number | The exponent used for the expression |

Returns

fn

Math.pow(base, exponent)

▸ (base): number

Parameters
| Name | Type |
| --- | --- |
| base | number |
Returns

number
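
For example, the exponent comes first in the curried form:

const square = pow(2);
square(3); // 9, i.e. Math.pow(3, 2)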


processAverageVolume

processAverageVolume(data): number

Calculate the average volume using Root Mean Square, assuming the data is in float form

Parameters

| Name | Type | Description |
| --- | --- | --- |
| data | number[] | Audio frequency data |

Returns

number


resumeAudioOnInterruption

resumeAudioOnInterruption(audioContext): () => void

Resume the AudioContext whenever it is interrupted

Parameters

| Name | Type | Description |
| --- | --- | --- |
| audioContext | AudioContext | AudioContext |

Returns

fn

▸ (): void

Returns

void


resumeAudioOnUnmute

resumeAudioOnUnmute(context): (track: MediaStreamTrack) => Unsubscribe

Resume the AudioContext whenever the source track is unmuted

Parameters

| Name | Type |
| --- | --- |
| context | AudioContext |

Returns

fn

▸ (track): Unsubscribe

Parameters
| Name | Type | Description |
| --- | --- | --- |
| track | MediaStreamTrack | The source track on which to listen for the unmute event |
Returns

Unsubscribe
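
A hedged usage sketch (the stream and audioContext variables are assumptions):

const [track] = stream.getAudioTracks();
const unsubscribe = resumeAudioOnUnmute(audioContext)(track);
// later, when tearing down:
unsubscribe();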


rms

rms(nums): number

Calculate the Root Mean Square from provided numbers

Parameters

| Name | Type | Description |
| --- | --- | --- |
| nums | number[] | An array of numbers |

Returns

number
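
A minimal sketch of the Root Mean Square calculation (not the exported implementation):

const rmsSketch = (nums: number[]): number =>
    Math.sqrt(nums.reduce((acc, n) => acc + n * n, 0) / nums.length);
rmsSketch([3, 4]); // ≈ 3.536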


round

round(num): number

Round the floating point number away from zero, which is different from Math.round

Example

round(0.5) // 1
round(-0.5) // -1

Parameters

| Name | Type | Description |
| --- | --- | --- |
| num | number | The number to round |

Returns

number


subscribeTimeoutAnalyzerNode

subscribeTimeoutAnalyzerNode(analyzer, options): () => void

Subscribe to a timeout loop to get the data from the Analyzer

Parameters

| Name | Type | Description |
| --- | --- | --- |
| analyzer | Analyzer | The analyzer to subscribe to |
| options | AnalyzerSubscribableOptions | Message handler, etc. |

Returns

fn

▸ (): void

Returns

void


subscribeWorkletNode

subscribeWorkletNode<T>(workletNode, options?): () => void

Subscribe to MessagePort messages from an AudioWorkletNode

Type parameters

| Name |
| --- |
| T |

Parameters

| Name | Type | Description |
| --- | --- | --- |
| workletNode | AudioWorkletNode | The node to subscribe to |
| options | Partial<WorkletMessagePortOptions<T>> | A message handler can be passed here to handle messages |

Returns

fn

▸ (): void

Returns

void


sum

sum(nums): number

Sum an array of numbers

Parameters

| Name | Type | Description |
| --- | --- | --- |
| nums | number[] | An array of numbers |

Returns

number


toDecibel

toDecibel(gain): number

Convert a floating point gain value into a dB representation without any reference (dBFS, https://en.wikipedia.org/wiki/DBFS).

See https://www.w3.org/TR/webaudio#conversion-to-db

Parameters

| Name | Type |
| --- | --- |
| gain | number |

Returns

number
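
A minimal sketch of the dBFS conversion per the linked Web Audio formula (not necessarily the library's exact code):

const toDecibelSketch = (gain: number): number => 20 * Math.log10(gain);
toDecibelSketch(1); // 0 dBFS
toDecibelSketch(0.5); // ≈ -6.02 dBFS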